17 research outputs found

    Generalized Random Gilbert-Varshamov Codes

    We introduce a random coding technique for transmission over discrete memoryless channels, reminiscent of the basic construction attaining the Gilbert-Varshamov bound for codes in Hamming spaces. The code construction is based on drawing codewords recursively from a fixed type class, in such a way that a newly generated codeword must be at a certain minimum distance from all previously chosen codewords, according to some generic distance function. We derive an achievable error exponent for this construction and prove its tightness with respect to the ensemble average. We show that the exponent recovers the Csiszár and Körner exponent as a special case, which is known to be at least as high as both the random-coding and expurgated exponents, and we establish the optimality of certain choices of the distance function. In addition, for additive distances and decoding metrics, we present an equivalent dual expression, along with a generalization to infinite alphabets via cost-constrained random coding.
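
    The recursive construction described in the abstract can be illustrated with a short sketch. The following Python toy is not taken from the paper; the Hamming distance, the particular type class, and the rejection loop are illustrative assumptions standing in for the generic distance function and fixed type class used there.

        import itertools
        import random
        from collections import Counter

        def hamming_distance(x, y):
            """Example distance; the construction allows any generic distance function."""
            return sum(a != b for a, b in zip(x, y))

        def draw_gv_codebook(type_class, num_codewords, min_dist,
                             distance=hamming_distance, max_attempts=10000):
            """Draw codewords one at a time from a fixed type class, accepting a draw
            only if it is at least min_dist from every previously accepted codeword."""
            codebook = []
            attempts = 0
            while len(codebook) < num_codewords and attempts < max_attempts:
                candidate = random.choice(type_class)
                attempts += 1
                if all(distance(candidate, c) >= min_dist for c in codebook):
                    codebook.append(candidate)
            return codebook

        # Toy usage: binary length-6 sequences of a fixed type (exactly three 1s).
        type_class = [seq for seq in itertools.product([0, 1], repeat=6)
                      if Counter(seq)[1] == 3]
        codebook = draw_gv_codebook(type_class, num_codewords=4, min_dist=3)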

    Asymptotically false-positive-maximizing attack on non-binary Tardos codes

    We use a method recently introduced by Simone and Skoric to study accusation probabilities for non-binary Tardos fingerprinting codes. We generalize the pre-computation steps in this approach to include a broad class of collusion attack strategies. We analytically derive properties of a special attack that asymptotically maximizes false accusation probabilities. We present numerical results on sufficient code lengths for this attack, and explain the abrupt transitions that occur in these results.
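
    As rough orientation for the accusation probabilities discussed above, here is a minimal sketch of the symbol-symmetric accusation score for a binary Tardos code. The paper treats the non-binary case; the binary weights, function names, and threshold convention below are illustrative assumptions, not the paper's construction.

        import math

        def tardos_score(user_word, pirate_word, biases):
            """Symmetric accusation score for a binary Tardos code.
            user_word, pirate_word: 0/1 sequences; biases: per-position bias p_i."""
            score = 0.0
            for x, y, p in zip(user_word, pirate_word, biases):
                if y == 1:
                    score += math.sqrt((1 - p) / p) if x == 1 else -math.sqrt(p / (1 - p))
                else:
                    score += math.sqrt(p / (1 - p)) if x == 0 else -math.sqrt((1 - p) / p)
            return score

        # A user is accused when the score exceeds a threshold; the false-positive
        # probability studied above is the tail probability of this score for an
        # innocent user.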

    A Counter-Example to the Mismatched Decoding Converse for Binary-Input Discrete Memoryless Channels

    This paper studies the mismatched decoding problem for binary-input discrete memoryless channels. An example is provided for which an achievable rate based on superposition coding exceeds the LM rate (Hui, 1983; Csiszár-Körner, 1981), thus providing a counter-example to a previously reported converse result (Balakirsky, 1995). Both numerical evaluations and theoretical results are used in establishing this claim. (Extended version of a paper accepted to the IEEE Transactions on Information Theory; the rate derivation and numerical algorithms are included in the appendices.)
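
    For reference, the LM rate mentioned above is commonly written in the mismatched-decoding literature in the following form (the notation below is the standard one rather than a quotation from the paper): for input distribution Q, channel W, and decoding metric q,

        R_{\mathrm{LM}}(Q) \;=\; \min_{\substack{\widetilde{P}_{XY}:\; \widetilde{P}_X = Q,\; \widetilde{P}_Y = P_Y, \\ \mathbb{E}_{\widetilde{P}}[\log q(X,Y)] \,\ge\, \mathbb{E}_{P}[\log q(X,Y)]}} I_{\widetilde{P}}(X;Y),
        \qquad \text{where } P_{XY}(x,y) = Q(x)\,W(y|x).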

    A Recursive Cost-Constrained Construction that Attains the Expurgated Exponent

    We show that a recursive cost-constrained random coding scheme attains an error exponent that is at least as high as both the random-coding exponent and the expurgated exponent. The random coding scheme enforces that every pair of codewords in the codebook meets a minimum distance condition, and is reminiscent of the Gilbert-Varshamov construction, but with the notable feature of permitting continuous-alphabet channels. The distance function is initially arbitrary, and it is shown that the Chernoff/Bhattacharyya distance suffices to attain the random-coding and expurgated exponents.
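
    To make the Chernoff/Bhattacharyya distance concrete, here is a small sketch (the channel representation and function names are assumptions for illustration): it computes the additive per-letter distance -sum_i log sum_y sqrt(W(y|x_i) W(y|x'_i)) between two codewords.

        import math

        def bhattacharyya_distance(x, x_prime, W):
            """Additive Bhattacharyya distance between codewords x and x_prime,
            where W[a] is the channel output distribution (a dict) given input a."""
            d = 0.0
            for a, b in zip(x, x_prime):
                overlap = sum(math.sqrt(W[a][o] * W[b][o]) for o in W[a])
                d += -math.log(overlap)
            return d

        # Toy usage on a binary symmetric channel with crossover probability 0.1.
        W = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}}
        print(bhattacharyya_distance((0, 0, 1, 1), (0, 1, 1, 0), W))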

    The Error Exponent of Generalized Random Gilbert-Varshamov Codes

    We introduce a random code construction for channel coding in which the codewords are constrained to be well-separated according to a given distance function, analogously to an existing construction attaining the Gilbert-Varshamov bound. We derive an achievable error exponent for this construction, and prove its tightness with respect to the ensemble average. We show that the exponent recovers the Csiszár and Körner exponent as a special case by choosing the distance function to be the negative of the empirical mutual information. We further establish the optimality of this distance function with respect to the exponent of the random coding scheme.
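
    The special case mentioned above can be made concrete with a small sketch (the function names are illustrative, not from the paper): the distance between two codewords is taken as the negative mutual information of their joint empirical distribution.

        import math
        from collections import Counter

        def empirical_mutual_information(x, y):
            """Mutual information (in nats) of the joint empirical distribution of (x, y)."""
            n = len(x)
            joint = Counter(zip(x, y))
            marg_x = Counter(x)
            marg_y = Counter(y)
            mi = 0.0
            for (a, b), c in joint.items():
                p_ab = c / n
                mi += p_ab * math.log(p_ab * n * n / (marg_x[a] * marg_y[b]))
            return mi

        def neg_empirical_mi_distance(x, y):
            """Distance choice that recovers the Csiszár and Körner exponent."""
            return -empirical_mutual_information(x, y)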

    The Error Exponent of Random Gilbert-Varshamov Codes

    We consider transmission over a discrete memoryless channel (DMC) $W(y|x)$ with finite alphabets $\mathcal{X}$ and $\mathcal{Y}$. It is assumed that an $(n, M_n)$-codebook $\mathcal{M}_n = \{x_1, \ldots, x_{M_n}\}$ with rate $R_n = \frac{1}{n}\log M_n$ is used for transmission. The type-dependent maximum-metric decoder estimates the transmitted message as $\hat{m} = \arg\max_{x_i \in \mathcal{M}_n} q(\hat{P}_{x_i y})$ (1), where $\hat{P}_{x_i y}$ is the joint empirical distribution [1, Ch. 2] of the pair $(x_i, y)$ and the metric $q : \mathcal{P}(\mathcal{X} \times \mathcal{Y}) \to \mathbb{R}$ is continuous. Maximum-likelihood (ML) decoding is a special case of (1), but the decoder may in general be mismatched [2], [3].
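
    A minimal sketch of the decoder in (1), under assumed data structures (sequences as tuples, the channel as a dict of output distributions); the function names are illustrative.

        import math
        from collections import Counter

        def joint_type(x, y):
            """Joint empirical distribution (joint type) of two equal-length sequences."""
            n = len(x)
            return {pair: c / n for pair, c in Counter(zip(x, y)).items()}

        def max_metric_decode(codebook, y, metric):
            """Type-dependent maximum-metric decoder of (1): return the index of the
            codeword whose joint type with y maximizes the metric q."""
            return max(range(len(codebook)),
                       key=lambda m: metric(joint_type(codebook[m], y)))

        # Maximum-likelihood decoding is the special case
        # q(P) = sum_{a,b} P(a,b) * log W(b|a).
        def ml_metric(W):
            return lambda P: sum(p * math.log(W[a][b]) for (a, b), p in P.items())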
